Speech recognition experiments with a new multilayer LVQ network (MLVQ)

Author

  • Gerhard Rigoll
Abstract

In this paper, a new neural network paradigm and its application to the recognition of speech patterns are presented. The novel NN paradigm is a multilayer version of Kohonen's well-known LVQ algorithm. The approach includes the following innovations and improvements compared to other popular neural network paradigms: 1) To the author's knowledge, it is the first multilayer version of the classical LVQ algorithm, which is usually based on a one-layer neural network architecture. 2) It presents a new NN architecture, since it uses a perceptron-like propagation function for the hidden layers and a Euclidean-like propagation function for the output layer. 3) Its architecture can be considered an optimal compromise between the multilayer principle of an MLP and the principle, adopted from classical LVQ, of representing one class by several neurons. 4) Compared to an MLP, training an MLVQ network with the same number of neurons is more efficient, since only the weights of the winning neuron are updated, as in classical LVQ. 5) It outperforms both the classical LVQ and the MLP algorithm in most of the experiments carried out. 6) In the initialization phase, the layers are trained hierarchically, making use of unsupervised, information-theory-based training algorithms.
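To make points 2) to 4) concrete, the following minimal Python/NumPy sketch (hypothetical class and parameter names, not the author's original implementation) combines a perceptron-like hidden layer with an output layer of class prototypes compared by Euclidean distance, where only the winning prototype is updated during supervised training, as in classical LVQ1:

    import numpy as np

    class MLVQSketch:
        # Perceptron-like hidden layer followed by an output layer of class
        # prototypes compared by Euclidean distance (winner-take-all).
        def __init__(self, n_in, n_hidden, n_classes, protos_per_class=3, lr=0.05, seed=0):
            rng = np.random.default_rng(seed)
            self.W = rng.normal(scale=0.1, size=(n_hidden, n_in))   # hidden weights
            self.b = np.zeros(n_hidden)                             # hidden biases
            self.protos = rng.normal(scale=0.1, size=(n_classes * protos_per_class, n_hidden))
            self.proto_labels = np.repeat(np.arange(n_classes), protos_per_class)
            self.lr = lr

        def hidden(self, x):
            # perceptron-like propagation: weighted sum plus sigmoid nonlinearity
            return 1.0 / (1.0 + np.exp(-(self.W @ x + self.b)))

        def predict(self, x):
            h = self.hidden(x)
            d = np.linalg.norm(self.protos - h, axis=1)  # Euclidean propagation in the output layer
            return self.proto_labels[np.argmin(d)]

        def train_step(self, x, y):
            # LVQ1-style update: only the winning prototype moves, towards the
            # hidden representation if its class label is correct, away otherwise
            h = self.hidden(x)
            d = np.linalg.norm(self.protos - h, axis=1)
            k = int(np.argmin(d))
            sign = 1.0 if self.proto_labels[k] == y else -1.0
            self.protos[k] += sign * self.lr * (h - self.protos[k])

In the full approach described in the abstract, the layers would additionally be pre-trained hierarchically with unsupervised, information-theory-based algorithms during initialization; that step is omitted from this sketch.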


Similar resources

A Comparison of the LBG, LVQ, MLP, SOM and GMM Algorithms for Vector Quantisation and Clustering Analysis

We compare the performance of five algorithms for vector quantisation and clustering analysis: the Self-Organising Map (SOM) and Learning Vector Quantization (LVQ) algorithms of Kohonen, the Linde-Buzo-Gray (LBG) algorithm, the MultiLayer Perceptron (MLP) and the GMM/EM algorithm for Gaussian Mixture Models (GMM). We propose that the GMM/EM provides a better representation of the speech space an...

Full text
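As background for the comparison above, a minimal sketch of k-means-style LBG codebook training (Python/NumPy, illustrative only and not the authors' code; random initialisation is used here instead of the classic codebook splitting) might look like this:

    import numpy as np

    def lbg_codebook(data, n_codewords, n_iter=20, seed=0):
        # k-means-style LBG training: assign each vector to its nearest
        # codeword, then move each codeword to the centroid of its cell.
        rng = np.random.default_rng(seed)
        codebook = data[rng.choice(len(data), n_codewords, replace=False)].copy()
        for _ in range(n_iter):
            d = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
            nearest = d.argmin(axis=1)            # nearest-codeword assignment
            for k in range(n_codewords):
                members = data[nearest == k]
                if len(members):                  # leave a codeword unchanged if its cell is empty
                    codebook[k] = members.mean(axis=0)
        return codebook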

Some Experiments On Face Recognition With Neural Networks

This paper presents some results on the possibilities offered by neural networks for human face recognition. In particular, two algorithms have been tested: learning vector quantization (LVQ) and multilayer perceptron (MLP). Two different approaches have been taken for each case, using as input data either preprocessed images (gray level or segmented), or geometrical features derived from a set...

Full text

Dialogue Act Recognition in Estonian Dialogues Using Artificial Neural Networks

This paper describes two experiments in applying dialogue act recognition to Estonian dialogues. Two sets of classes were used in both of them: a general one (with 19 classes) and a detailed one (with 107 classes). In the first experiment the task was performed using learning vector quantization (LVQ). The preprocessing was done in WEBSOM (Self-Organizing Maps for Internet Exploration) style; the ...

Full text

Persian Phone Recognition Using Acoustic Landmarks and Neural Network-Based Variability Compensation Methods

Speech recognition is a subfield of artificial intelligence that develops technologies to convert speech utterances into transcriptions. So far, various methods such as hidden Markov models and artificial neural networks have been used to develop speech recognition systems. In most of these systems, the speech signal frames are processed uniformly, while the information is not evenly distributed ...

Full text

A New Multistage Lattice Vector Quantization with Adaptive Subband Thresholding for Image Compression

Lattice vector quantization (LVQ) reduces coding complexity and computation due to its regular structure. A new multistage LVQ (MLVQ) using an adaptive subband thresholding technique is presented and applied to image compression. The technique concentrates on reducing the quantization error of the quantized vectors by “blowing out” the residual quantization errors with an LVQ scale factor. The ...

Full text



Publication date: 1995